Spiking and saturating dendrites differentially expand single neuron computation capacity

Authors

  • Romain D. Cazé
  • Mark D. Humphries
  • Boris S. Gutkin
Abstract

The integration of excitatory inputs in dendrites is non-linear: multiple excitatory inputs can produce a local depolarization that departs from the arithmetic sum of each input's response taken separately. If this depolarization is larger than the arithmetic sum, the dendrite is spiking; if it is smaller, the dendrite is saturating. Decomposing a dendritic tree into independent spiking dendritic units greatly extends its computational capacity, as the neuron then maps onto a two-layer neural network, enabling it to compute linearly non-separable Boolean functions (lnBFs). How can these lnBFs be implemented by dendritic architectures in practice? And can saturating dendrites equally expand computational capacity? To address these questions we use a binary neuron model and Boolean algebra. First, we confirm that spiking dendrites enable a neuron to compute lnBFs using an architecture based on the disjunctive normal form (DNF). Second, we prove that saturating dendrites, as well as spiking dendrites, enable a neuron to compute lnBFs using an architecture based on the conjunctive normal form (CNF). Unlike in a DNF-based architecture, in a CNF-based architecture dendritic unit tunings do not imply the neuron's tuning, as has been observed experimentally. Third, we show that a DNF-based architecture cannot be used with saturating dendrites. Consequently, we show that an important family of lnBFs implemented with a CNF architecture can require an exponential number of saturating dendritic units, whereas the same family implemented with either a DNF or a CNF architecture always requires only a linear number of spiking dendritic units. This minimization could explain why a neuron expends energetic resources to make its dendrites spike.
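
As an illustration of the two architectures, the sketch below (Python; not taken from the paper, and the target function f(x) = (x1 AND x2) OR (x3 AND x4), the subunit groupings, and all thresholds are illustrative assumptions) builds a binary two-layer neuron whose dendritic subunits apply either a spiking (all-or-none) or a saturating (capped) non-linearity to their local synaptic sums, and verifies over all inputs that a DNF-based architecture with two spiking subunits and a CNF-based architecture with four saturating subunits compute the same linearly non-separable Boolean function.

    # Minimal sketch of a two-layer binary neuron model: each dendritic
    # subunit applies a non-linearity to its local synaptic sum, and the
    # soma thresholds the summed subunit outputs.  The target function,
    # subunit groupings and thresholds are illustrative assumptions.
    from itertools import product

    def spiking(local_sum, threshold=2):
        """Supra-linear (spiking) subunit: all-or-none local event."""
        return 1 if local_sum >= threshold else 0

    def saturating(local_sum, ceiling=1):
        """Sub-linear (saturating) subunit: output capped at a ceiling."""
        return min(local_sum, ceiling)

    def neuron(x, subunits, nonlinearity, soma_threshold):
        """Sum the non-linear subunit outputs and threshold at the soma."""
        total = sum(nonlinearity(sum(x[i] for i in group)) for group in subunits)
        return 1 if total >= soma_threshold else 0

    # Target linearly non-separable Boolean function: (x1 AND x2) OR (x3 AND x4).
    def target(x):
        return 1 if (x[0] and x[1]) or (x[2] and x[3]) else 0

    # DNF-based architecture: one spiking subunit per conjunctive term;
    # the soma fires if at least one subunit spikes (logical OR).
    dnf_subunits = [(0, 1), (2, 3)]

    # CNF-based architecture: one saturating subunit per disjunctive clause of
    # (x1 OR x3)(x1 OR x4)(x2 OR x3)(x2 OR x4); the soma fires only when every
    # clause is satisfied (logical AND), i.e. the total reaches 4.
    cnf_subunits = [(0, 2), (0, 3), (1, 2), (1, 3)]

    for x in product((0, 1), repeat=4):
        assert neuron(x, dnf_subunits, spiking, soma_threshold=1) == target(x)
        assert neuron(x, cnf_subunits, saturating, soma_threshold=4) == target(x)
    print("Both architectures reproduce the lnBF on all 16 inputs.")

Even at this small scale, the CNF implementation needs four saturating subunits against two spiking ones, illustrating the unit-count gap discussed above.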

Related articles

Passive Dendrites Enable Single Neurons to Compute Linearly Non-separable Functions

Local supra-linear summation of excitatory inputs occurring in pyramidal cell dendrites, the so-called dendritic spikes, results in independent spiking dendritic sub-units, which turn pyramidal neurons into two-layer neural networks capable of computing linearly non-separable functions, such as the exclusive OR. Other neuron classes, such as interneurons, may possess only a few independent dend...
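
A minimal, self-contained check of the claim above (in the same Python sketch style; the integer weight grid and the signed weights in the two-subunit construction are assumptions of this illustration, not of the cited paper): a brute-force search finds no single threshold unit reproducing the exclusive OR, whereas two spiking-like dendritic sub-units followed by a somatic OR do.

    # The exclusive OR is linearly non-separable: no single weighted sum
    # with a threshold reproduces it (the search covers a small integer
    # grid, which suffices for two binary inputs), but a two-layer
    # arrangement with two non-linear sub-units does.
    from itertools import product

    XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

    def separable(table, grid=range(-3, 4)):
        """Search integer weights and threshold for one threshold unit."""
        for w1, w2, theta in product(grid, repeat=3):
            if all((w1 * x1 + w2 * x2 >= theta) == bool(y)
                   for (x1, x2), y in table.items()):
                return True
        return False

    def two_layer_xor(x1, x2):
        # Each sub-unit detects one input in the absence of the other;
        # the negative weight stands in for inhibition (an assumption).
        d1 = 1 if (x1 - x2) >= 1 else 0
        d2 = 1 if (x2 - x1) >= 1 else 0
        return 1 if d1 + d2 >= 1 else 0          # somatic OR

    print("XOR separable by one unit:", separable(XOR))             # False
    print("Two sub-units compute XOR:",
          all(two_layer_xor(*k) == v for k, v in XOR.items()))      # True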

Building blocks for electronic spiking neural networks

We present an electronic circuit modelling the spike generation process in the biological neuron. This simple circuit is capable of simulating the spiking behaviour of several different types of biological neurons. At the same time, the circuit is small so that many neurons can be implemented on a single silicon chip. This is important, as neural computation obtains its power not from a single ...

Temporal sequence detection with spiking neurons: towards recognizing robot language instructions

We present an approach for recognition and clustering of spatio-temporal patterns based on networks of spiking neurons with active dendrites and dynamic synapses. We introduce a new model of an integrate-and-fire neuron with active dendrites and dynamic synapses (ADDS) and its synaptic plasticity rule. The neuron employs the dynamics of the synapses and the active properties of the dendrites as ...

Digital Neuron Cells for Highly Parallel Cognitive Systems

Biophysically meaningful neuron models can be used to simulate human brain behavior. The understanding of neuron behavior is expected to play a prominent role in fields such as artificial intelligence, the treatment of damaged brains, etc. In most cases, the high level of realism of spiking neuron networks and their complexity require considerable computational resources, limiting the size of the...

Capacity of a Single Spiking Neuron Channel

Information transfer through a single neuron is a fundamental component of information processing in the brain, and computing the information channel capacity is important to understand this information processing. The problem is difficult since the capacity depends on coding, characteristics of the communication channel, and optimization over input distributions, among other issues. In this le...

Publication year: 2012